ennemi: Non-linear correlation detection with mutual information

Authors

Abstract

We present ennemi, a Python package for correlation analysis based on mutual information (MI). MI is a measure of the relationship between two variables. Unlike the Pearson correlation coefficient, it is also valid for non-linear relationships, yet in the linear case the two measures are equivalent. The effect of other variables can be removed, as with partial correlation, and the same equivalence holds. These features make MI better suited to exploratory analysis of many variable pairs. Our package provides methods for common correlation tasks using MI. It is scalable, integrates with the data science ecosystem, and requires minimal configuration.
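
To make the comparison concrete, here is a minimal sketch of the idea described above. It is not the ennemi API: it uses scikit-learn's k-nearest-neighbour MI estimator and maps the MI estimate onto a Pearson-like scale via the Gaussian relation ρ = √(1 − e^(−2·MI)); the helper mi_correlation is an illustrative name only. A quadratic dependence that Pearson correlation misses shows up clearly in the MI-based coefficient, while for a linear dependence the two measures roughly agree.

# Conceptual illustration only; not the ennemi package API.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.normal(size=2000)

# Linear relationship: the MI-based coefficient should roughly match |Pearson r|.
y_lin = 0.8 * x + rng.normal(scale=0.6, size=x.size)
# Non-linear (quadratic) relationship: Pearson r is near zero, MI is not.
y_sq = x ** 2 + rng.normal(scale=0.3, size=x.size)

def mi_correlation(x, y):
    """MI estimate mapped to a [0, 1) correlation scale via rho = sqrt(1 - exp(-2*MI))."""
    mi = mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=3, random_state=0)[0]
    return np.sqrt(1.0 - np.exp(-2.0 * mi))

for name, y in [("linear", y_lin), ("quadratic", y_sq)]:
    pearson = np.corrcoef(x, y)[0, 1]
    print(f"{name:9s}  Pearson r = {pearson:+.2f}   MI correlation = {mi_correlation(x, y):.2f}")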

Similar resources

Boolean functions: noise stability, non-interactive correlation, and mutual information

Let ε ∈ [0, 1/2] be the noise parameter and p > 1. We study the isoperimetric problem of which Boolean function f : {0, 1}^n → {0, 1} with fixed mean Ef maximizes the p-th moment E(T_ε f)^p of the noise operator T_ε acting on Boolean functions. Our findings are: in the low noise scenario, i.e., when ε is small, the maximum is achieved by the lexicographical function; in the high noise scenario, i.e...

Shot boundary detection with mutual information

We present a novel approach for shot boundary detection that uses mutual information (MI) and affine image registration. The MI measures the statistical difference between consecutive frames, while the applied affine registration compensates for camera panning and zooming. Results for different sequences are presented to illustrate the motion and zoom compensation and the robustness of MI to il...
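
As a rough illustration of the MI measure used there (the affine registration step is omitted, and frame_mutual_information is a hypothetical helper, not code from that paper), the statistical similarity of two greyscale frames can be estimated from their joint intensity histogram:

import numpy as np

def frame_mutual_information(frame_a, frame_b, bins=64):
    """MI (in nats) between the intensity distributions of two frames."""
    joint, _, _ = np.histogram2d(frame_a.ravel(), frame_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero]))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160))
shifted = np.clip(frame + rng.integers(-10, 10, size=frame.shape), 0, 255)
unrelated = rng.integers(0, 256, size=frame.shape)
print(frame_mutual_information(frame, shifted))    # high MI: likely the same shot
print(frame_mutual_information(frame, unrelated))  # near zero: candidate shot boundary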

Correlation Distance and Bounds for Mutual Information

The correlation distance quantifies the statistical independence of two classical or quantum systems, via the distance from their joint state to the product of the marginal states. Tight lower bounds are given for the mutual information between pairs of two-valued classical variables and quantum qubits, in terms of the corresponding classical and quantum correlation distances. These bounds are ...

Mutual Information Functions Versus Correlation Functions

This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not lead to M(d) ...
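
A small sketch of the two quantities for a symbolic sequence may help fix ideas. The function names are illustrative and the estimators are the plain plug-in ones, not necessarily those used in the paper:

import numpy as np

def mi_function(seq, d):
    """M(d): mutual information (in nats) between symbols d positions apart."""
    a, b = seq[:-d], seq[d:]
    mi = 0.0
    for u in np.unique(seq):
        for v in np.unique(seq):
            p_uv = np.mean((a == u) & (b == v))
            p_u, p_v = np.mean(a == u), np.mean(b == v)
            if p_uv > 0:
                mi += p_uv * np.log(p_uv / (p_u * p_v))
    return mi

def correlation_function(seq, d):
    """Gamma(d): normalized autocovariance of the sequence at lag d."""
    s = seq - seq.mean()
    return np.mean(s[:-d] * s[d:]) / seq.var()

rng = np.random.default_rng(1)
seq = (rng.random(10000) < 0.5).astype(int)          # i.i.d. binary sequence
print(mi_function(seq, 3), correlation_function(seq, 3))  # both close to zero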

Mutual Information Correlation with Human Vision in Medical Image Compression

Background: The lossy compression algorithm produces different results in various contrast areas. Low contrast area image quality declines more than that of high contrast regions at an equal compression ratio. These results were obtained in a subjective study. The objective image quality metrics are more effective if the calculation method is more closely related to the human vision resul...


Journal

Journal title: SoftwareX

Year: 2021

ISSN: 2352-7110

DOI: https://doi.org/10.1016/j.softx.2021.100686